# Pre-training and Fine-tuning
## Cross Encoder Binary Topic Classification
**Author:** enochlev · **Tags:** Text Embedding, Transformers · **Downloads:** 28 · **Likes:** 0

A cross-encoder model based on the Transformer architecture, primarily used for text ranking tasks.
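
Below is a minimal usage sketch for this kind of cross-encoder with the sentence-transformers `CrossEncoder` class. The repository ID `enochlev/cross-encoder-binary-topic-classification` is an assumption inferred from the author and name in the listing and may differ.

```python
# Minimal sketch: scoring (text, topic) pairs with a cross-encoder.
# The model ID below is inferred from the listing and may differ.
from sentence_transformers import CrossEncoder

model = CrossEncoder("enochlev/cross-encoder-binary-topic-classification")

# A cross-encoder reads both texts of each pair jointly and returns one score
# per pair, which can be used to rank topics (or documents) for a given text.
pairs = [
    ("The central bank raised interest rates again.", "finance"),
    ("The central bank raised interest rates again.", "sports"),
]
scores = model.predict(pairs)
for (text, topic), score in zip(pairs, scores):
    print(f"{topic}: {score}")
```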
## Sentence Flaubert Base
**Author:** Lajavaness · **License:** Apache-2.0 · **Tags:** Text Embedding, French · **Downloads:** 1,846 · **Likes:** 3

A French sentence embedding model based on FlauBERT for computing sentence similarity.
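
A minimal sketch of computing French sentence similarity with sentence-transformers; the repository ID `Lajavaness/sentence-flaubert-base` is inferred from the listing.

```python
# Minimal sketch: French sentence similarity with sentence-transformers.
# The model ID is inferred from the listing and may differ.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Lajavaness/sentence-flaubert-base")

sentences = [
    "Le chat dort sur le canapé.",           # "The cat sleeps on the couch."
    "Un félin fait la sieste sur le sofa.",  # paraphrase of the first sentence
    "La bourse a chuté ce matin.",           # unrelated sentence
]
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity between the first sentence and the other two;
# the paraphrase should score noticeably higher than the unrelated sentence.
print(util.cos_sim(embeddings[0], embeddings[1:]))
```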
## Rubert Base Cased Sentiment Rusentiment
**Author:** blanchefort · **Tags:** Text Classification, Other · **Downloads:** 80.75k · **Likes:** 12

A sentiment analysis model based on the DeepPavlov/rubert-base-cased-conversational architecture, trained on the RuSentiment dataset, capable of identifying neutral, positive, and negative sentiment in Russian text.
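
A minimal sketch using the transformers text-classification pipeline, assuming the model ID follows the author/name pattern shown in the listing.

```python
# Minimal sketch: Russian sentiment classification with the transformers pipeline.
# The model ID is assembled from the author and name shown in the listing.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="blanchefort/rubert-base-cased-sentiment-rusentiment",
)

print(classifier("Отличный сервис, всем рекомендую!"))    # "Great service, highly recommend!" -> positive
print(classifier("Очень разочарован качеством товара."))  # "Very disappointed with the quality." -> negative
```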
## Mbart Large 50 Many To One Mmt
**Author:** facebook · **Tags:** Machine Translation, Multilingual · **Downloads:** 28.70k · **Likes:** 66

A multilingual machine translation model fine-tuned from mBART-large-50, supporting translation from any of 50 source languages into English.
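
The sketch below follows the commonly documented mBART-50 usage: set the source language code on the tokenizer, then generate; for this many-to-one checkpoint the target language is always English.

```python
# Sketch: translating French into English with mbart-large-50-many-to-one-mmt.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_id = "facebook/mbart-large-50-many-to-one-mmt"
model = MBartForConditionalGeneration.from_pretrained(model_id)
tokenizer = MBart50TokenizerFast.from_pretrained(model_id)

tokenizer.src_lang = "fr_XX"  # which of the 50 source languages the input is in
inputs = tokenizer("Le climat change plus vite que prévu.", return_tensors="pt")
output_ids = model.generate(**inputs)
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True))
```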
## Roberta Base Chinese Extractive Qa
**Author:** uer · **Tags:** Question Answering, Chinese · **Downloads:** 2,694 · **Likes:** 98

A Chinese extractive question-answering model based on the RoBERTa architecture, suitable for extracting answer spans from a given text.
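
A minimal sketch with the transformers question-answering pipeline, using the author/name from the listing; note that an extractive model always returns a span copied from the supplied context.

```python
# Minimal sketch: extractive QA over Chinese text with the transformers pipeline.
from transformers import pipeline

qa = pipeline("question-answering", model="uer/roberta-base-chinese-extractive-qa")

result = qa(
    question="《假如生活欺骗了你》的作者是谁？",  # "Who wrote 'If Life Deceives You'?"
    context="普希金是俄国著名的诗人，他创作了《假如生活欺骗了你》等作品。",
)
print(result)  # answer span, score, and character offsets within the context
```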
## Codet5 Small Code Summarization Python
**Author:** stmnk · **License:** Apache-2.0 · **Tags:** Text Generation, Transformers, Multilingual · **Downloads:** 38 · **Likes:** 2

CodeT5-small is a Transformer-based code understanding and generation model developed by Salesforce, specifically fine-tuned for Python code-to-text (code summarization) tasks.
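
A minimal sketch of code-to-text generation with this checkpoint; the repository ID `stmnk/codet5-small-code-summarization-python` is an assumption inferred from the listing.

```python
# Minimal sketch: summarizing a Python function with a CodeT5-small checkpoint.
# The model ID below is inferred from the listing and may differ.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "stmnk/codet5-small-code-summarization-python"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

code = '''
def fibonacci(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
'''

inputs = tokenizer(code, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_length=32)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```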